[Journalism Internship] Korea’s 'AI Basic Act': An opportunity or an obstacle?
YUN JI-MIN, SOPHIA CHO, JO AH-YOUNG, LEE SEUNG-JOON
The National Assembly passes the AI Basic Act during a plenary session in Yeouido, western Seoul, on Dec. 26, 2024, with 260 votes in favor, one against and three abstentions out of 264 lawmakers present. [NEWS1]
Korea has paved the way for global AI governance, becoming the first country to formally promote the safe use of AI through the enactment of the so-called AI Basic Act.
AI is becoming increasingly significant across industries and in daily life as a transformative "general purpose technology" that drives innovation and enhances productivity. The AI Basic Act was created to prevent the disorder that irresponsible use of AI could cause, positioning Korea as a “global leader in trustworthy and innovative AI,” according to the International Trade Administration.
Yet, despite the policy’s initial purpose, it has failed to settle the confusion among businesses and is dividing the tech industry, according to experts.
The basics of the AI bill
Ryu Je-myung, second vice minister of science and ICT, speaks during an experts’ roundtable for the AI Basic Act support desk hosted by the Korea Artificial Intelligence Software Industry Association in Songpa District, southern Seoul, on Jan. 22. [MINISTRY OF SCIENCE AND ICT]
The AI Basic Act took effect on Jan. 22, making Korea the first country to implement a complete law on AI governance.
The act requires companies and AI developers to take greater responsibility for fake content and misinformation by notifying users in advance when AI has been used.
Companies that develop high-impact AI — systems that significantly affect human life — must explain their outcomes and training data, create user protection plans, maintain human oversight and document the measures taken to ensure trust and safety. Videos or photos made with AI, for example, must carry watermarks to prevent deception.
The act is designed to establish a national governance framework of AI, systematically foster the AI industry and pre-emptively prevent potential risks associated with AI, according to the Ministry of Science and ICT.
Yoo Sang-im, the Minister of Science and ICT, said, “We consider the passage of the basic act on artificial intelligence in the National Assembly to be highly significant as it will lay the foundation for strengthening the country's AI competitiveness. Amid the intense global competition for AI, enactment of this AI Basic Act is a crucial milestone for Korea to truly take a leap forward as one of the world's top three AI powerhouses, resolving corporate uncertainty and stimulating large-scale public-private investment.”
Backlash to the bill
A mobile phone display shows the icons of AI apps Deepseek, Chatgpt, Copilot, Perplexity and Gemini in Berlin, Germany, on Oct. 31, 2025. [EPA/YONHAP]
Despite Minister Yoo's optimistic outlook, the AI Basic Act has stirred considerable turmoil, drawing a range of opposing views from the tech field.
In 2025, leading global AI developers such as OpenAI and Google requested flexibility in the law’s application, citing its rigid requirements and broad extraterritorial reach, according to industry insiders.
Specifically, they object to the AI Basic Act’s definition of “high-impact AI” as any system trained with a computational volume of 10 to the 26th power floating-point operations (FLOPs) or greater. According to the companies, the FLOP threshold is a poor proxy for actual risk because its rigid structure ignores how the AI is actually used, creating a Korea-specific compliance wall that does not align with preexisting global safety standards.
The act’s “blunt regulatory triggers,” critics say, could sideline several AI models by imposing unnecessary safety checks that limit the use of advanced AI models in Korea.
Furthermore, the Business Software Alliance, which represents key enterprises such as Microsoft, Adobe and Oracle, has been particularly critical of the unclear allocation of responsibility in the policy. It has formally requested that the Korean government revise the definition of high-impact AI as it believes that regulation should target specific cases rather than entire industries.
The main concern is that the act does not distinguish between AI developers and AI deployers, and this lack of distinction could freeze the business-to-business AI market, a crucial industry segment where AI models are developed, sold and applied to enhance operations and marketing between businesses.
However, the most deep-rooted opposition to the policy comes from the Startup Alliance, which represents Korea’s domestic startup ecosystem. Its primary worry is that the act could crush smaller firms before they can grow. A survey by the organization, cited in the Asia Business Daily, found that only 2 percent of startups were preparing for the act, while the remaining 98 percent were either “unaware of the details and unprepared" or “aware of the law but not adequately responding.”
Startup founders have expressed frustration over Korea's status as the first country to enforce a comprehensive AI law, arguing that mandatory watermarks will drain the limited capital of local enterprises.
Suggestions by experts on legislation improvement
National Assembly speaker Woo Won-shik delivers a congratulatory address during a seminar titled “The status and challenges of the AI Basic Act” at the National Assembly Members’ Office in Yeouido, western Seoul, on June 17, 2025.
Amid the confusion surrounding several of the definitions in the AI Basic Act, legal scholars and leading practitioners are calling for changes.
A paper by Park Sang-chul, a Seoul National University law school professor, called for the new legislation to be revised.
"The ‘horizontal regulatory system’ that governs all AI under a single set of rules does not fit reality,” he told the Electronic Times.
His argument is that a single regulatory system applied uniformly to all AI is likely to cause overregulation, so he suggests another approach: a regulatory framework tailored to industries and situations and focused on how AI is used, similar to those in the United States, China and Britain, which have prioritized AI development.
Prof. Park suggested that the legal term "high-impact AI" be amended to encompass a broader definition, such as "AI with significant impact on the rights of citizens."
Choi Kyung-jin, a law professor and the president of the Korea Association for AI and Law, suggested another refinement in the Digital Times, saying, “While efforts are necessary to reduce the potential for problems to arise, accumulating experience in law enforcement is also important. While prioritizing promotion, a balance should be found by implementing necessary regulations centered on actions and examples in each area.”
He notes that the act requires more precise subordinate rules and compliance mechanisms to be more effective and fairer.
Thus, by implementing this policy, Korea earned the distinction of being first — but also took on the heavy responsibility of attempting something new. To make the AI Basic Act an opportunity and not an obstacle, Korea must now smooth the path it has paved.
BY YUN JI-MIN, SOPHIA CHO, JO AH-YOUNG, LEE SEUNG-JOON
with the Korea JoongAng Daily